155 research outputs found

    Data Mining


    Phase space techniques in neural network models


    The Role of Staff Development in School-Based Management

    Today’s frontier is knowledge. Brain has taken precedence over brawn; our physical struggle for existence has been replaced by intellectual struggle, and knowledge of words has become the most valuable tool in this struggle. Words are the cornerstone of any language. With a good vocabulary, which indicates scope of knowledge, we can grasp the thoughts of others and communicate our own thoughts to them. As Stahl (1999) argued, discussion of words is discussion of knowledge of the world, and knowledge of the world is knowledge of who we are and where we stand in the world. The importance of words in foreign and second language learning is likewise beyond question. Vocabulary knowledge is one of the language skills crucial for fluent language use (Nation, 1993). Vocabulary size is an indicator of how well second language (L2) learners can perform academic language skills such as reading, listening, and writing (Bear, Invernizzi, Templeton, & Johnston, 2008; Treiman & Casar, 1996). Numerous studies have documented the strong and reciprocal relationship between vocabulary knowledge and reading comprehension (Baker, Simmons, & Kame’enui, 1995; Beck, McKeown, & Kucan, 2002; Graves, 2000; Stahl & Fairbanks, 1987) as well as general reading ability (Stanovich, Cunningham, & West, 1998). Likewise, Saville-Troike (1984) concluded that vocabulary knowledge is the single best predictor of students’ academic achievement across subject matter domains. There is also strong agreement among researchers that promoting vocabulary growth is an important and often neglected component of a comprehensive reading program (Baumann & Kame’enui, 2004; National Reading Panel, 2000; Vaezi & Fallah, 2010).

    Topological Properties of SU(n) Fermions

    Ultra-cold fermions loaded in optical lattices have become ideal systems for studying related electronic phase diagrams and transport properties, because they provide a clean and well-controlled playground in which various lattice parameters and external fields can be changed at the turn of a knob. It is now possible to create artificial magnetic fields in optical lattices that mimic electronic materials exhibiting integer and fractional quantum Hall effects. The synthetic magnetic flux values created in optical lattices are sufficiently large to allow for the experimental exploration of the intricacies of Harper’s model and the Hofstadter butterfly, as well as the experimental determination of Chern numbers. For ultracold fermions in optical lattices, artificial magnetic fields enable studies of topological insulators that break time-reversal symmetry, such as quantum Hall systems, while artificial spin-orbit fields allow for studies of topological insulators that do not break time-reversal symmetry, such as quantum spin Hall systems. Both types of topological insulators are characterized by Berry curvatures and Chern numbers, which have been measured experimentally using time-of-flight techniques, inspired by theoretical proposals, and using dynamics of the center of mass of the atomic cloud, also motivated by theoretical work. However, studies of ultracold fermions may go beyond the quantum simulation of spin-1/2 topological insulators under typical condensed matter conditions, because artificial magnetic, spin-orbit, and Zeeman fields may be adjusted independently. This thesis develops the topological properties and discusses the quantum Hall responses of SU(N) fermions in two-dimensional lattices in the presence of artificial magnetic flux and color-orbit coupling.
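    The Harper model mentioned in the abstract has a simple numerical core: at rational flux α = p/q per plaquette, the Hamiltonian reduces to a q×q Bloch matrix whose eigenvalues, swept over all p/q, trace out the Hofstadter butterfly. A minimal numpy sketch, assuming nearest-neighbor hopping of amplitude 1 and fixing the transverse momentum to zero for brevity:

    ```python
    import numpy as np

    def harper_spectrum(p, q, k=0.0):
        """Eigenvalues of the Harper Hamiltonian at flux alpha = p/q.

        The magnetic unit cell has q sites; k is the Bloch momentum
        along the chain, entering through the boundary hopping phase.
        """
        alpha = p / q
        H = np.zeros((q, q), dtype=complex)
        for n in range(q):
            H[n, n] = 2.0 * np.cos(2.0 * np.pi * alpha * n)  # on-site Harper term
            H[n, (n + 1) % q] = 1.0  # nearest-neighbor hopping
            H[(n + 1) % q, n] = 1.0
        # Bloch phase on the hopping that closes the magnetic unit cell
        H[q - 1, 0] = np.exp(1j * k * q)
        H[0, q - 1] = np.exp(-1j * k * q)
        return np.linalg.eigvalsh(H)  # real, ascending eigenvalues

    # One column of the butterfly: the q = 3 subbands at flux 1/3
    print(harper_spectrum(1, 3))
    ```

    Plotting `harper_spectrum(p, q)` for all coprime p/q with q up to a few dozen reproduces the familiar fractal band structure; the Chern numbers discussed above label the gaps of this spectrum.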

    The Simultaneous Impact of Supplier and Customer Involvement on New Product Performance

    Product recall rates in recent years have triggered a new round of interest in the impacts of supplier involvement (SI) and customer involvement (CI) on new product performance (NPP). However, the existing literature examines either SI or CI but not both: most supply chain management papers focus on SI and NPP, while research in the marketing field focuses on CI and NPP. Additionally, NPP has not been broken down into detailed dimensions in these previous studies. This research investigates the impact of both SI and CI on three dimensions of NPP, namely new product quality and reliability, time to market, and product innovativeness. The research is based on data from over 600 manufacturers in 21 countries. Structural equation modeling (SEM) is used to test the simultaneous impact of SI and CI on NPP. The results show that SI influences all three dimensions of NPP, while CI influences only quality and reliability. The research also reveals that companies pay more attention to CI than to SI; it seems that more effort in both academic and practical fields is needed to enhance SI in relation to NPD. The research suggests that both SI and CI should be implemented in the new product development process: it is not a two-party issue but a three-party issue.
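    The structural part of the model described above can be illustrated without SEM software: as a simplified analogue (not the authors' actual model), regress each NPP dimension on SI and CI composites and compare the path coefficients. Everything below is synthetic and hypothetical, with the "true" paths chosen to echo the reported pattern that SI feeds all three dimensions while CI mainly affects quality and reliability:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 600  # roughly the sample size reported in the abstract

    # Hypothetical synthetic composites for supplier and customer involvement
    si = rng.normal(size=n)
    ci = rng.normal(size=n)

    # Hypothetical structural relations echoing the reported pattern
    outcomes = {
        "quality":        0.4 * si + 0.5 * ci + rng.normal(scale=0.5, size=n),
        "time_to_market": 0.6 * si +            rng.normal(scale=0.5, size=n),
        "innovativeness": 0.5 * si +            rng.normal(scale=0.5, size=n),
    }

    # Fit each NPP dimension on [intercept, SI, CI] by least squares
    X = np.column_stack([np.ones(n), si, ci])
    paths = {}  # fitted (SI, CI) coefficients per NPP dimension
    for name, y in outcomes.items():
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        paths[name] = (round(beta[1], 2), round(beta[2], 2))
        print(name, paths[name])
    ```

    A full SEM treatment would additionally model measurement error in the latent SI/CI constructs and test overall model fit, which is what distinguishes the abstract's approach from this plain-regression sketch.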

    Evaluation of High Performance Fortran through Application Kernels

    Since the definition of the High Performance Fortran (HPF) standard, we have been maintaining a suite of application kernel codes with the aim of using them to evaluate the available compilers. This paper presents the results and conclusions from this study, for sixteen codes, on compilers from IBM, DEC, and the Portland Group Inc. (PGI), and on three machines: a DEC Alphafarm, an IBM SP-2, and a Cray T3D. From this, we hope to show the prospective HPF user that scalable performance is possible with modest effort, yet also where the current weaknesses lie.

    Exploration of Emerging HPCN Technologies for Web-Based Distributed Computing

    The surge in the popularity of the World Wide Web (WWW) has corresponded to a decreasing market for specialised high performance computers. This paper discusses how, by making use of technology developed at the broader end of the computing pyramid, much of the past decade's work in distributed computing can be realised in the context of the larger WWW market. Not only do these new technologies offer fresh possibilities, but their pace of development is unlikely to be matched by the traditional high performance research community. A motivating application, discussions of the pertinent emerging technologies, and NPAC's investigations of them will be presented.

    Cluster Computing Review

    In the past decade there has been a dramatic shift from mainframe or ‘host-centric’ computing to a distributed ‘client-server’ approach. In the next few years this trend is likely to continue, with further shifts towards ‘network-centric’ computing becoming apparent. All these trends were set in motion by the invention of the mass-reproducible microprocessor by Ted Hoff of Intel some twenty-odd years ago. The present generation of RISC microprocessors are now more than a match for mainframes in terms of cost and performance. The long-foreseen day when collections of RISC microprocessors assembled together as a parallel computer could outperform the vector supercomputers has finally arrived. Such high-performance parallel computers incorporate proprietary interconnection networks allowing low-latency, high-bandwidth inter-processor communications. However, for certain types of applications such interconnect optimization is unnecessary and conventional LAN technology is sufficient. This has led to the realization that clusters of high-performance workstations can realistically be used for a variety of applications, either to replace mainframes, vector supercomputers, and parallel computers or to better manage already installed collections of workstations. Whilst it is clear that ‘cluster computers’ have limitations, many institutions and companies are exploring this option. Software to manage such clusters is at an early stage of development, and this report reviews the current state-of-the-art. Cluster computing is a rapidly maturing technology that seems certain to play an important part in the ‘network-centric’ computing future.